graphql-subscriptions
The graphql-subscriptions package provides a simple way to add real-time capabilities to your GraphQL server using subscriptions. It allows you to subscribe to specific events and get notified when those events occur, enabling real-time updates in your applications.
PubSub Implementation
The PubSub implementation allows you to create a simple publish-subscribe mechanism. You can publish events and subscribe to them using the PubSub class.
const { PubSub } = require('graphql-subscriptions');
const pubsub = new PubSub();
// Publishing an event
pubsub.publish('EVENT_NAME', { data: 'some data' });
// Subscribing to an event
const subscription = pubsub.asyncIterator('EVENT_NAME');
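The returned iterator can then be consumed by pulling values with next(). A small sketch (the ordering here, pulling before publishing again, is an assumption so that the pending pull actually receives the event):
// Pull the next published value; resolves once something is published
subscription.next().then(({ value, done }) => {
  console.log(value); // { data: 'some data' }
  console.log(done);  // false while the iterator is still open
});

// Publish while the pull is pending so the value is delivered
pubsub.publish('EVENT_NAME', { data: 'some data' });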
GraphQL Subscription Setup
This code demonstrates how to set up a GraphQL subscription using the graphql-subscriptions package. It defines a subscription type and uses the PubSub instance to handle the subscription logic.
const { GraphQLObjectType, GraphQLSchema, GraphQLString } = require('graphql');
const { PubSub } = require('graphql-subscriptions');
const pubsub = new PubSub();
const SubscriptionType = new GraphQLObjectType({
  name: 'Subscription',
  fields: {
    messageSent: {
      type: GraphQLString,
      subscribe: () => pubsub.asyncIterator('MESSAGE_SENT')
    }
  }
});
const schema = new GraphQLSchema({
  subscription: SubscriptionType
});
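To try this schema without a network transport, you can run the subscription directly with graphql's subscribe function. A sketch (the query string and logging are illustrative):
const { subscribe, parse } = require('graphql');

async function run() {
  const iterator = await subscribe({
    schema,
    document: parse('subscription { messageSent }'),
  });
  // On validation errors, subscribe resolves to a single ExecutionResult instead of an iterator
  for await (const result of iterator) {
    console.log(result.data.messageSent); // logged for every publish to 'MESSAGE_SENT'
  }
}
run();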
Triggering Subscriptions
This feature shows how to trigger a subscription event. The sendMessage function publishes an event that can be subscribed to by clients.
const { PubSub } = require('graphql-subscriptions');
const pubsub = new PubSub();
// Function to trigger a subscription
function sendMessage(message) {
  pubsub.publish('MESSAGE_SENT', { messageSent: message });
}
// Example usage
sendMessage('Hello, world!');
The subscriptions-transport-ws package provides a WebSocket-based transport for GraphQL subscriptions. It is more focused on the transport layer and integrates well with Apollo Server. Compared to graphql-subscriptions, it offers more control over the WebSocket connection and supports advanced features like connection parameters and custom message handlers.
The graphql-ws package is a lightweight and modern WebSocket implementation for GraphQL subscriptions. It is designed to be simple and efficient, with a focus on performance and ease of use. Compared to graphql-subscriptions, it provides a more streamlined API and better performance for high-throughput applications.
The graphql-yoga package is a fully-featured GraphQL server that includes built-in support for subscriptions. It is designed to be easy to set up and use, with sensible defaults and a focus on developer experience. Compared to graphql-subscriptions, it offers a more integrated solution with less boilerplate code required to get started.
GraphQL subscriptions is a simple npm package that lets you wire up GraphQL with a pubsub system (like Redis) to implement subscriptions in GraphQL.
You can use it with any GraphQL client and server (not only Apollo).
npm install graphql-subscriptions graphql
or yarn add graphql-subscriptions graphql
This package should be used with a network transport, for example subscriptions-transport-ws.
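A rough sketch of wiring an executable schema into subscriptions-transport-ws (the HTTP server setup, port, path, and the ./schema module are assumptions):
import { createServer } from 'http';
import { execute, subscribe } from 'graphql';
import { SubscriptionServer } from 'subscriptions-transport-ws';
import { schema } from './schema'; // assumed module exporting your executable schema

const httpServer = createServer();
httpServer.listen(4000, () => {
  // Attach the subscription WebSocket endpoint to the HTTP server
  SubscriptionServer.create(
    { schema, execute, subscribe },
    { server: httpServer, path: '/graphql' }
  );
});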
If you are developing a project that uses this module with TypeScript:
make sure your tsconfig.json lib definition includes "esnext.asynciterable"
npm install @types/graphql or yarn add @types/graphql
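For example, the relevant part of tsconfig.json might look like this (the other lib entries are illustrative):
{
  "compilerOptions": {
    "lib": ["es2017", "esnext.asynciterable"]
  }
}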
To begin with GraphQL subscriptions, start by defining a GraphQL Subscription type in your schema:
type Subscription {
  somethingChanged: Result
}

type Result {
  id: String
}
Next, add the Subscription type to your schema definition:
schema {
  query: Query
  mutation: Mutation
  subscription: Subscription
}
Now, let's create a simple PubSub instance. It is a basic pubsub implementation based on EventEmitter. Alternative EventEmitter implementations can be passed via an options object to the PubSub constructor.
import { PubSub } from 'graphql-subscriptions';
export const pubsub = new PubSub();
Now, implement your Subscriptions type resolver, using pubsub.asyncIterator to map the event you need:
const SOMETHING_CHANGED_TOPIC = 'something_changed';
export const resolvers = {
  Subscription: {
    somethingChanged: {
      subscribe: () => pubsub.asyncIterator(SOMETHING_CHANGED_TOPIC),
    },
  },
}
Subscription resolvers are not functions, but objects with a subscribe method that returns an AsyncIterable.
Now the GraphQL engine knows that somethingChanged is a subscription, and every time we call pubsub.publish for this topic, it will be published over the transport we use:
pubsub.publish(SOMETHING_CHANGED_TOPIC, { somethingChanged: { id: "123" }});
Note that the default PubSub implementation is intended for demo purposes. It only works if you have a single instance of your server and doesn't scale beyond a couple of connections. For production usage you'll want to use one of the PubSub implementations backed by an external store (e.g. Redis).
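For instance, a drop-in Redis-backed engine such as the one from the graphql-redis-subscriptions package might be wired up roughly like this (a sketch; that package and the connection options are assumptions, not part of this module):
import { RedisPubSub } from 'graphql-redis-subscriptions';

// Same publish / asyncIterator API as the in-memory PubSub
export const pubsub = new RedisPubSub({
  connection: { host: '127.0.0.1', port: 6379 },
});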
When publishing data to subscribers, we need to make sure that each subscriber gets only the data it needs. To do so, we can use the withFilter helper from this package, which wraps the AsyncIterator with a filter function and lets you control each publication for each user.
withFilter API:
asyncIteratorFn: (rootValue, args, context, info) => AsyncIterator<any> - a function that returns the AsyncIterator you got from your pubsub.asyncIterator.
filterFn: (payload, variables, context, info) => boolean | Promise<boolean> - a filter function, executed with the payload (the published value), variables, context and operation info; it must return a boolean or Promise<boolean> indicating whether the payload should be passed to the subscriber.
For example, if somethingChanged also accepted a variable with the relevant ID, we could use the following code to filter according to it:
import { withFilter } from 'graphql-subscriptions';
const SOMETHING_CHANGED_TOPIC = 'something_changed';
export const resolvers = {
  Subscription: {
    somethingChanged: {
      subscribe: withFilter(() => pubsub.asyncIterator(SOMETHING_CHANGED_TOPIC), (payload, variables) => {
        return payload.somethingChanged.id === variables.relevantId;
      }),
    },
  },
}
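For this filter to receive relevantId, the somethingChanged field would also need to declare it as an argument, and the client would pass it in the subscription operation. A sketch of how that might look (the operation name is illustrative):
type Subscription {
  somethingChanged(relevantId: String): Result
}

# Example client operation
subscription OnSomethingChanged($relevantId: String) {
  somethingChanged(relevantId: $relevantId) {
    id
  }
}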
Note that when using withFilter, you don't need to wrap your return value with a function.
You can map multiple channels into the same subscription, for example when there are multiple events that trigger the same subscription in the GraphQL engine.
const SOMETHING_UPDATED = 'something_updated';
const SOMETHING_CREATED = 'something_created';
const SOMETHING_REMOVED = 'something_removed';
export const resolvers = {
  Subscription: {
    somethingChanged: {
      subscribe: () => pubsub.asyncIterator([ SOMETHING_UPDATED, SOMETHING_CREATED, SOMETHING_REMOVED ]),
    },
  },
}
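Publishing to any of these topics then pushes a value to the same somethingChanged subscription, for example (the payload ids below are just illustrative):
pubsub.publish(SOMETHING_CREATED, { somethingChanged: { id: 'new-1' } });
pubsub.publish(SOMETHING_UPDATED, { somethingChanged: { id: '42' } });
pubsub.publish(SOMETHING_REMOVED, { somethingChanged: { id: '42' } });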
You can also manipulate the published payload by adding a resolve method to your subscription:
const SOMETHING_UPDATED = 'something_updated';
export const resolvers = {
  Subscription: {
    somethingChanged: {
      resolve: (payload, args, context, info) => {
        // Manipulate and return the new value
        return payload.somethingChanged;
      },
      subscribe: () => pubsub.asyncIterator(SOMETHING_UPDATED),
    },
  },
}
Your database might have callback-based listeners for changes, for example something like this:
const listenToNewMessages = (callback) => {
  return db.table('messages').listen(newMessage => callback(newMessage));
}

// Kick off the listener
listenToNewMessages(message => {
  console.log(message);
})
The callback function would be called every time a new message is saved in the database. Unfortunately, that doesn't play very well with async iterators out of the box, because callbacks are push-based whereas async iterators are pull-based. We recommend using the callback-to-async-iterator module to convert your callback-based listener into an async iterator:
import asyncify from 'callback-to-async-iterator';
export const resolvers = {
  Subscription: {
    somethingChanged: {
      subscribe: () => asyncify(listenToNewMessages),
    },
  },
}
AsyncIterator Wrappers
The value you return from your subscribe resolver must be an AsyncIterator.
You can wrap this value with another AsyncIterator to implement custom logic over your subscriptions.
For example, the following implementation manipulates the payload by adding some static fields:
import { $$asyncIterator } from 'iterall';
export const withStaticFields = (asyncIterator: AsyncIterator<any>, staticFields: Object): Function => {
  return (rootValue: any, args: any, context: any, info: any): AsyncIterator<any> => {
    return {
      next() {
        return asyncIterator.next().then(({ value, done }) => {
          return {
            value: {
              ...value,
              ...staticFields,
            },
            done,
          };
        });
      },
      return() {
        return Promise.resolve({ value: undefined, done: true });
      },
      throw(error) {
        return Promise.reject(error);
      },
      [$$asyncIterator]() {
        return this;
      },
    };
  };
};
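A wrapper like this can then be used directly as a subscribe function, since it returns a function with the resolver signature. A usage sketch (the topic name and static field are illustrative; note that pubsub.asyncIterator is created once here rather than per subscriber):
export const resolvers = {
  Subscription: {
    somethingChanged: {
      // every payload will also carry { source: 'wrapped' }
      subscribe: withStaticFields(pubsub.asyncIterator('something_changed'), { source: 'wrapped' }),
    },
  },
}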
You can also take a look at withFilter for inspiration.
The default PubSub can easily be replaced with other implementations of the PubSubEngine abstract class, for example ones backed by an external store such as Redis.
You can also implement a PubSub of your own, using the abstract class PubSubEngine exported from this package. If you use extends PubSubEngine you get the default asyncIterator method implementation; if you use implements PubSubEngine you must implement your own AsyncIterator.
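For illustration only, a minimal in-memory engine built with extends PubSubEngine might look like this (a sketch; the publish/subscribe/unsubscribe shapes follow the abstract class, the internals are assumptions):
import { PubSubEngine } from 'graphql-subscriptions';
import { EventEmitter } from 'events';

class EmitterPubSub extends PubSubEngine {
  constructor() {
    super();
    this.emitter = new EventEmitter();
    this.subscriptions = new Map(); // subId -> [triggerName, handler]
    this.subIdCounter = 0;
  }
  publish(triggerName, payload) {
    this.emitter.emit(triggerName, payload);
    return Promise.resolve();
  }
  subscribe(triggerName, onMessage) {
    this.emitter.addListener(triggerName, onMessage);
    this.subIdCounter += 1;
    this.subscriptions.set(this.subIdCounter, [triggerName, onMessage]);
    return Promise.resolve(this.subIdCounter);
  }
  unsubscribe(subId) {
    const [triggerName, onMessage] = this.subscriptions.get(subId);
    this.subscriptions.delete(subId);
    this.emitter.removeListener(triggerName, onMessage);
  }
}

// asyncIterator() is inherited from PubSubEngine, so the engine is usable as-is:
const pubsub = new EmitterPubSub();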
SubscriptionManager was the previous alternative to using graphql-js subscriptions directly, and it is now deprecated. If you are looking for its API docs, refer to a previous commit of the repository.
1.2.1
withFilter (PR #209)

FAQs
GraphQL subscriptions for node.js
The npm package graphql-subscriptions receives a total of 487,890 weekly downloads. As such, its popularity is classified as popular.
We found that graphql-subscriptions demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 1 open source maintainer collaborating on the project.